
    Light scattering and transmission measurement using digital imaging for online analysis of constituents in milk

    Milk is an emulsion of fat globules and casein micelles dispersed in an aqueous medium with dissolved lactose, whey proteins and minerals. Quantification of constituents in milk is important at various stages of the dairy supply chain for proper process control and quality assurance. In field-level applications, spectrophotometric analysis is an economical option due to the low cost of silicon photodetectors, which are sensitive to UV/Vis radiation with wavelengths between 300 and 1100 nm. Both absorption and scattering occur as incident UV/Vis radiation interacts with dissolved and dispersed constituents in milk. These effects can in turn be used to characterize the chemical and physical composition of a milk sample. However, in order to simplify analysis, most existing instruments require dilution of samples to avoid the effects of multiple scattering. The sample preparation steps are usually expensive, prone to human error and unsuitable for field-level and online analysis. This paper introduces a novel digital-imaging-based method for online spectrophotometric measurements on raw milk without any sample preparation. Multiple LEDs of different emission spectra are used as discrete light sources and a digital CMOS camera is used as an image sensor. The extinction characteristic of samples is derived from the captured images. The dependence of multiple scattering on the power of the incident radiation is exploited to quantify scattering. The method has been validated with experiments for response to varying fat concentrations and fat globule sizes. Despite the presence of multiple scattering, the method is able to unequivocally quantify the extinction of incident radiation and relate it to the fat concentrations and globule sizes of samples.
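
    The abstract does not specify how extinction is computed from the captured images; as a rough illustration under the conventional Beer-Lambert definition, a per-LED extinction value could be derived from the mean pixel intensity of a sample image relative to a reference (blank) image. The function and variable names below are hypothetical, not the paper's actual pipeline.

        import numpy as np

        def extinction_from_images(sample_img, reference_img):
            # Mean transmitted intensity over the region of interest for the
            # sample and for a reference image (e.g., a water-filled cuvette).
            i_sample = np.asarray(sample_img, dtype=float).mean()
            i_reference = np.asarray(reference_img, dtype=float).mean()
            # Beer-Lambert style extinction (absorbance): A = -log10(I / I0).
            return -np.log10(i_sample / i_reference)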

    Antimicrobial Activity of Ethanolic Extracts of Syzygium aromaticum and Allium sativum Against Food Associated Bacteria and Fungi

    The successful control of food spoilage microorganisms requires the use of indigenous antimicrobials in foods, including certain botanical compounds that have historically been used for flavour enhancement as well as preservation. The present study was designed to evaluate the in vitro antimicrobial activity of ethanolic extracts of Syzygium aromaticum (clove) and Allium sativum (garlic) against Gram-positive and Gram-negative food-associated bacteria (Bacillus subtilis, B. megaterium, B. polymyxa, B. sphaericus, Staphylococcus aureus and Escherichia coli) and molds (Penicillium oxalicum, Aspergillus flavus, A. luchuensis, Rhizopus stolonifer, Scopulariopsis sp. and Mucor sp.), assayed by the agar well diffusion method and the poisoned food technique, respectively. Clove extract showed better antimicrobial activity than the garlic extract. The zone of inhibition of the clove ethanolic extract against the food-associated bacteria was in the range of 25 mm to 32 mm, and for molds the percent mycelial growth inhibition ranged from 70% to 100%. The growth inhibition zone of the garlic ethanolic extract against bacteria was in the range of 20 mm to 31 mm, and for molds the percent mycelial growth inhibition ranged between 20% and 50%. The clove ethanolic extract exhibited the maximum zone of inhibition against E. coli, whereas the garlic ethanolic extract showed maximum activity against B. subtilis. Both extracts exhibited maximum percent mycelial growth inhibition against R. stolonifer. However, garlic extract was not effective against P. oxalicum. The MIC values of the clove ethanolic extract ranged from 5.0 mg/ml to 20 mg/ml for the bacterial isolates and from 10 mg/ml to 20 mg/ml for the molds. The MIC values of the garlic ethanolic extract for the bacterial and fungal isolates ranged from 10 mg/ml to 20 mg/ml. The MBC and MFC values equaled the corresponding MIC values. Based on these findings, it may be suggested that these extracts could be used as natural antimicrobial additives to extend the shelf-life of foods.
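
    The abstract does not state the inhibition formula; the conventional calculation for the poisoned food technique compares radial colony growth on treated plates against an untreated control, as sketched below. The function name and the example diameters are illustrative, not the study's data.

        def percent_mycelial_inhibition(control_diameter_mm, treated_diameter_mm):
            # Conventional poisoned-food calculation: growth inhibition expressed
            # relative to the untreated control colony diameter.
            return 100.0 * (control_diameter_mm - treated_diameter_mm) / control_diameter_mm

        # Example: a 60 mm control colony reduced to 18 mm gives 70% inhibition.
        print(percent_mycelial_inhibition(60.0, 18.0))  # 70.0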

    Assessment of Thrombotic and Non-thrombotic Troponin Elevation (NTTE)

    AIMS: Assessment of thrombotic and non-thrombotic troponin elevation (NTTE). OBJECTIVES: 1) To correlate cardiac troponin T (cTnT) levels in critically ill non-cardiac patients (CINCP) and acute coronary syndrome (ACS) patients with duration of hospitalization, in-hospital mortality and 30-day mortality. 2) To describe the range of troponin elevation and the positive and negative predictors for NTTE in CINCP stratified according to APACHE II score. METHODS: This was an observational cross-sectional study conducted in the departments of Cardiology and Medicine (Critical Care division). 100 patients with ACS admitted to the CCU/CPU and 100 CINCP (APACHE II score >12) with elevated cTnT levels admitted to the MICU/MHDU were included in the study. A baseline troponin sample was sent within 24 hours of admission. A second set of cTnT and CK-MB samples was sent within 24 hours of the first sample. The APACHE II score was calculated within 24 hours of admission to the MICU/MHDU based on the worst value up to that point. Constitutional symptoms, risk factors for coronary artery disease, ECG, echocardiography and renal parameters were documented. The data for all groups were expressed as mean ± SD. Descriptive analysis was used to obtain numbers and percentages. Categorical values were reported as proportions and percentages and compared using the chi-square test. Continuous variables were summarized as mean with standard deviation and compared using the independent-sample t-test if normally distributed and the Mann-Whitney U test if not, in univariate analysis. Variables significant in univariate analysis were taken forward to multivariate analysis. Survival analysis was used to assess the association between in-hospital mortality and troponin levels in the thrombotic ACS group and the NTTE group. RESULTS: Troponin levels measured at baseline and within 24 hours of admission were significantly higher in patients admitted with ACS than in those with NTTE (373.04 ± 1124.51 vs. 289.16 ± 1062.26 and 743.31 ± 1303.90 vs. 258.62 ± 891.41, respectively; p = 0.05 and 0.000, respectively). When a troponin level of >300 ng/L was considered together with electrocardiographic and echocardiographic abnormalities in the presence of chest pain, acute coronary syndrome could be diagnosed with a sensitivity of 100%, a specificity of 77.59%, and a positive predictive value of 76.36%. Critically ill patients with high troponin levels had early in-hospital mortality (Pearson correlation coefficient = -0.068 and -0.072; p = 0.501 and 0.479, respectively). In the MICU group there was a positive trend toward increased mortality or re-hospitalization ≤30 days after discharge with elevated troponin T levels at baseline and within 24 hours of the first sample, but this was not statistically significant (p = 0.157 and 0.564). Sepsis (77.59%), ARF (68.97%), anemia (43.10%), shock (31.03%), pneumonia (25.86%), ARDS (20.69%), myocarditis (12.07%), CHF (10.34%), tachyarrhythmia (6.9%) and CPR (6.9%) were common causes associated with NTTE. CONCLUSION: The study showed that baseline troponin level was significantly associated with in-hospital mortality. In CINCP with a troponin value >300 ng/L along with chest pain, ECG and echocardiographic abnormalities, acute coronary syndrome could be diagnosed with reasonable accuracy. The commonest causes of NTTE were renal failure, anemia, sepsis and septic shock.
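
    For reference, the sensitivity, specificity and positive predictive value cited above follow from a standard 2x2 contingency table of the diagnostic rule (troponin >300 ng/L plus chest pain and ECG/echo abnormality) against the adjudicated diagnosis. The sketch below only shows the arithmetic; the counts are hypothetical, since the study's raw table is not given in the abstract.

        def diagnostic_metrics(tp, fp, fn, tn):
            # Standard 2x2 definitions: tp/fp/fn/tn are true/false positives/negatives.
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            ppv = tp / (tp + fp)
            return sensitivity, specificity, ppv

        # Hypothetical counts for illustration only.
        print(diagnostic_metrics(tp=20, fp=5, fn=0, tn=30))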

    Towards a Query Optimizer for Text-Centric Tasks

    Text is ubiquitous and, not surprisingly, many important applications rely on textual data for a variety of tasks. As a notable example, information extraction applications derive structured relations from unstructured text; as another example, focused crawlers explore the Web to locate pages about specific topics. Execution plans for text-centric tasks follow two general paradigms for processing a text database: either we can scan, or “crawl,” the text database or, alternatively, we can exploit search engine indexes and retrieve the documents of interest via carefully crafted queries constructed in task-specific ways. The choice between crawl- and query-based execution plans can have a substantial impact on both execution time and output “completeness” (e.g., in terms of recall). Nevertheless, this choice is typically ad hoc and based on heuristics or plain intuition. In this article, we present fundamental building blocks to make the choice of execution plans for text-centric tasks in an informed, cost-based way. Towards this goal, we show how to analyze query- and crawl-based plans in terms of both execution time and output completeness. We adapt results from random-graph theory and statistics to develop a rigorous cost model for the execution plans. Our cost model reflects the fact that the performance of the plans depends on fundamental task-specific properties of the underlying text databases. We identify these properties and present efficient techniques for estimating the associated parameters of the cost model. We also present two optimization approaches for text-centric tasks that rely on the cost-model parameters and select efficient execution plans. Overall, our optimization approaches help build efficient execution plans for a task, resulting in significant efficiency and output completeness benefits. We complement our results with a large-scale experimental evaluation for three important text-centric tasks and over multiple real-life data sets.
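
    The abstract describes a cost model for choosing between crawl- and query-based plans but gives no formulas; the sketch below only illustrates the general shape of a cost-based choice, picking the cheapest plan whose estimated recall meets a target. The plan names, cost estimates and parameters are hypothetical and are not the article's model.

        def choose_plan(plans, target_recall):
            """Pick the cheapest plan whose estimated recall meets the target.

            `plans` maps a plan name to (estimated_time_seconds, estimated_recall);
            in practice both estimates would come from task-specific parameters.
            """
            feasible = {name: (t, r) for name, (t, r) in plans.items() if r >= target_recall}
            if not feasible:
                return None
            return min(feasible, key=lambda name: feasible[name][0])

        # Hypothetical estimates for a crawl-based and a query-based plan.
        plans = {"crawl": (5400.0, 0.98), "query": (900.0, 0.85)}
        print(choose_plan(plans, target_recall=0.80))  # "query"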

    Local Viscosity Control Printing for High-Throughput Additive Manufacturing of Polymers

    The throughput of fused deposition modeling (FDM) is limited by process physics as well as by practical considerations favoring single-head polymer extrusion. To expedite the thermoplastic additive manufacturing process, we propose a parallelized material deposition process called local viscosity control (LVC) additive manufacturing. LVC prints an entire layer in one step by selectively modulating the viscosity of polymer feedstock in contact with a heated wire mesh. Layers of molten polymer are contact printed, with the relative motion between the wire mechanism and a build plate allowing wetting and surface tension to transfer selectively heated, lower-viscosity regions of polymer to a fixed substrate. Experiments demonstrate the viability of this process using a single cell depositing layered polycarbonate pixels. Theoretical analysis shows that this process may offer resolution comparable to conventional FDM with a significantly higher production rate for commonly available input power.
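
    As a back-of-the-envelope illustration of why layer-at-once deposition can raise throughput, the volumetric build rate of a serial extruder scales with nozzle cross-section times traverse speed, whereas a full-layer process scales with layer area times layer height over the per-layer cycle time. The numbers below are hypothetical, not the paper's measurements.

        # Hypothetical, order-of-magnitude comparison (lengths in mm, time in s).
        nozzle_width, layer_height, head_speed = 0.4, 0.2, 60.0   # typical FDM values
        fdm_rate = nozzle_width * layer_height * head_speed        # mm^3/s deposited serially

        layer_area, layer_cycle_time = 100.0 * 100.0, 30.0         # full-layer, LVC-style step
        lvc_rate = layer_area * layer_height / layer_cycle_time    # mm^3/s deposited per layer

        print(f"serial FDM: {fdm_rate:.1f} mm^3/s, layer-at-once: {lvc_rate:.1f} mm^3/s")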

    Appraising the therapeutical potentials of Alchornea laxiflora (Benth.) Pax & K. Hoffm., an underexplored medicinal herb: A systematic review

    Ethnopharmacological relevance: Alchornea laxiflora (Benth.) Pax & K. Hoffm. (Euphorbiaceae) is an important traditional medicinal plant grown in tropical Africa. The stem, leaves, and root have been widely used in the folk medicine systems of Nigeria, Cameroon, South Africa, and Ghana to treat various ailments, including inflammatory, infectious, and central nervous system disorders, such as anxiety and epilepsy. Material and methods: The scientific name of the plant was validated using the “The Plant List,” “Kew Royal Botanic Gardens,” and Tropicos nomenclatural databases. The literature search on A. laxiflora was performed using electronic search engines and databases such as Google Scholar, ScienceDirect, PubMed, AJOL, Scopus, and Mendeley. Results: To the best of our knowledge, no specific and detailed review has been reported on A. laxiflora. Consequently, this review provides an up-to-date systematic presentation of the ethnobotany, phytoconstituents, pharmacological activities, and toxicity profiles of A. laxiflora. Phytochemical investigations disclosed the presence of important compounds, such as alkaloids, flavonoids, phenolics, terpenoids, and fatty acids. Furthermore, various pharmacological activities and traditional uses reported for this botanical drug were discussed comprehensively. Conclusion: This systematic review presents the current status and perspectives of A. laxiflora as a potential therapeutic modality that would assist future researchers in exploring this African botanical drug as a source of novel drug candidates for varied diseases.

    Quantification of joint mobility limitation in adult type 1 diabetes

    Aims: Diabetic cheiroarthropathies limit hand mobility due to fibrosis and could be markers of a global profibrotic trajectory. Heterogeneity in definitions and the lack of a method to measure it complicate studying associations with organ involvement and treatment outcomes. We measured metacarpophalangeal (MCP) joint extension as a metric and describe magnetic resonance (MR) imaging determinants of MCP restriction. Methods: Adults with type 1 diabetes were screened for hand manifestations using a symptom questionnaire, clinical examination, and function [Duruoz hand index (DHI) and grip strength]. Patients were segregated by mean MCP extension (<20°, 20°–40°, 40°–60°, and >60°) for MR imaging (MRI) scanning. Patients in the four groups were compared using ANOVA for clinical features and MRI tissue measurements (tenosynovial, skin, and fascia thickness). We performed multiple linear regression for determinants of MCP extension. Results: Of the 237 patients (90 men), 79 (33.8%) with cheiroarthropathy had MCP extension limitation (39° versus 61°, p < 0.01). Groups with limited MCP extension had higher DHI (1.9 vs. 0.2) but few (7%) had pain. Height, systolic blood pressure, and nephropathy were associated with mean MCP extension. Hand MRI (n = 61) showed flexor tenosynovitis in four patients and median neuritis in one patient. Groups with MCP mobility restriction had the thickest palmar skin; tendon thickness and median nerve area did not differ. Only mean palmar skin thickness was associated with MCP extension angle on multiple linear regression. Conclusion: Joint mobility limitation was quantified by restricted mean MCP extension and had structural correlates on MRI. These can serve as quantitative measures for future associative and interventional studies.
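
    The abstract reports a multiple linear regression of mean MCP extension on clinical and MRI measurements; a minimal sketch of that kind of analysis with statsmodels is shown below. The file name and column names are hypothetical, since the study's variable coding is not given.

        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical dataset and columns; not the study's actual variable coding.
        df = pd.read_csv("mcp_cohort.csv")
        predictors = ["height_cm", "systolic_bp_mmhg", "nephropathy", "palmar_skin_thickness_mm"]
        X = sm.add_constant(df[predictors])
        model = sm.OLS(df["mean_mcp_extension_deg"], X, missing="drop").fit()
        print(model.summary())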

    CoNIC Challenge: Pushing the Frontiers of Nuclear Detection, Segmentation, Classification and Counting

    Nuclear detection, segmentation and morphometric profiling are essential in helping us further understand the relationship between histology and patient outcome. To drive innovation in this area, we set up a community-wide challenge using the largest available dataset of its kind to assess nuclear segmentation and cellular composition. Our challenge, named CoNIC, stimulated the development of reproducible algorithms for cellular recognition with real-time result inspection on public leaderboards. We conducted an extensive post-challenge analysis based on the top-performing models using 1,658 whole-slide images of colon tissue. With around 700 million detected nuclei per model, associated features were used for dysplasia grading and survival analysis, where we demonstrated that the challenge's improvement over the previous state-of-the-art led to significant boosts in downstream performance. Our findings also suggest that eosinophils and neutrophils play an important role in the tumour microenvironment. We release challenge models and WSI-level results to foster the development of further methods for biomarker discovery.
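
    As context for the downstream analysis, per-slide cellular composition features can be derived simply by counting detected nuclei of each predicted class and normalising. The sketch below shows one plausible way to do this; it is an illustration, not the challenge's actual feature pipeline, and the class names are assumptions.

        from collections import Counter

        def composition_features(predicted_classes, class_names):
            # predicted_classes: iterable of class labels, one per detected nucleus.
            counts = Counter(predicted_classes)
            total = sum(counts.values()) or 1
            return {name: counts.get(name, 0) / total for name in class_names}

        classes = ["epithelial", "lymphocyte", "neutrophil", "eosinophil", "plasma", "connective"]
        print(composition_features(["epithelial", "epithelial", "neutrophil"], classes))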